PAC-Bayesian Generalization Error Bounds for Gaussian Process Classification

Author

  • Matthias Seeger
Abstract

Approximate Bayesian Gaussian process (GP) classification techniques are powerful nonparametric learning methods, similar in appearance and performance to Support Vector Machines. Based on simple probabilistic models, they render interpretable results and can be embedded in Bayesian frameworks for model selection, feature selection, etc. In this paper, by applying the PAC-Bayesian theorem of McAllester (1999), we prove distribution-free generalization error bounds for a wide range of approximate Bayesian GP classification techniques. We instantiate and test these bounds for two particular GPC techniques, including a sparse method which circumvents the unfavourable scaling of standard GP algorithms. As is shown in experiments on a real-world task, the bounds can be very tight for moderate training sample sizes. To the best of our knowledge, these results provide the tightest known distribution-free error bounds for approximate Bayesian GPC methods, giving a strong learning-theoretical justification for the use of these techniques.
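For reference, the binary-KL form of the PAC-Bayesian theorem used in this line of work can be stated as follows (a standard statement, with assumed notation: ê(Q) is the empirical Gibbs error of posterior Q, e(Q) its true Gibbs error, P the prior, m the sample size):

```latex
% PAC-Bayesian theorem (binary-KL form): with probability at least 1 - \delta
% over the draw of an i.i.d. training sample of size m, simultaneously
% for all posteriors Q over the hypothesis space,
\mathrm{kl}\!\left(\hat{e}(Q) \,\middle\|\, e(Q)\right)
  \;\le\; \frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m+1}{\delta}}{m},
\qquad \text{where } \mathrm{kl}(q \,\|\, p)
  = q \ln\frac{q}{p} + (1-q)\ln\frac{1-q}{1-p}.
```

The left-hand side is the KL divergence between two Bernoulli distributions, so the inequality pins the true Gibbs error to an interval around the empirical error whose width shrinks as m grows and widens with the divergence between posterior and prior.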


Similar references

PAC-Bayesian Theorems for Gaussian Process Classification

We present distribution-free generalization error bounds which apply to a wide class of approximate Bayesian Gaussian process classification (GPC) techniques, powerful nonparametric learning methods similar to Support Vector Machines. The bounds use the PAC-Bayesian theorem [8], for which we provide a simplified proof, leading to new insights into its relation to traditional VC-type union bound t...

Full text

PAC-Bayesian Generalisation Error Bounds for Gaussian Process Classification

Approximate Bayesian Gaussian process (GP) classification techniques are powerful nonparametric learning methods, similar in appearance and performance to support vector machines. Based on simple probabilistic models, they render interpretable results and can be embedded in Bayesian frameworks for model selection, feature selection, etc. In this paper, by applying the PAC-Bayesian theorem of Mc...

Full text

Gaussian Processes Classification and its PAC-Bayes Generalization Error Bounds – CSE 291 Project Report

McAllester's PAC-Bayes theorem (strengthened by [4]) characterizes the convergence of a stochastic classifier's empirical error to its generalization error. Fixing one "prior" distribution P(h) over the hypothesis space H, the theorem holds for all "posterior" distributions Q(h) over H simultaneously, so in practice we can find a data-dependent posterior distribution over H as the distribution of ...

Full text
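To illustrate how a bound of this type is evaluated in practice, the sketch below (an illustration under assumed notation, not code from any of the papers above) numerically inverts the binary-KL inequality kl(ê ‖ e) ≤ (KL(Q‖P) + ln((m+1)/δ))/m to obtain an upper bound on the Gibbs classifier's generalization error:

```python
import math

def kl_bernoulli(q, p):
    """KL divergence between Bernoulli(q) and Bernoulli(p), in nats."""
    eps = 1e-12
    q = min(max(q, eps), 1 - eps)
    p = min(max(p, eps), 1 - eps)
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def pac_bayes_bound(emp_err, kl_qp, m, delta):
    """Upper bound on the true Gibbs error: the largest p >= emp_err with
    kl(emp_err || p) <= (KL(Q||P) + ln((m+1)/delta)) / m, found by bisection
    (kl(q || p) is increasing in p for p >= q)."""
    rhs = (kl_qp + math.log((m + 1) / delta)) / m
    lo, hi = emp_err, 1.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if kl_bernoulli(emp_err, mid) <= rhs:
            lo = mid
        else:
            hi = mid
    return lo

# Hypothetical numbers: 5% empirical Gibbs error, KL(Q||P) = 10 nats,
# m = 1000 training points, confidence 95% (delta = 0.05).
print(pac_bayes_bound(0.05, 10.0, 1000, 0.05))
```

Inverting the binary KL rather than using an explicit square-root relaxation is what makes bounds of this family tight at low empirical error: the resulting bound on e(Q) approaches ê(Q) much faster than a symmetric ± sqrt(·) interval would.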


Théorie Statistique de l’Apprentissage: une approche PAC-Bayésienne PAC-Bayesian Statistical Learning Theory

This PhD thesis is a mathematical study of the learning task – specifically classification and least squares regression – in order to better understand why an algorithm works and to propose more efficient procedures. The thesis consists of four papers. The first one provides a PAC bound for the L generalization error of methods based on combining regression procedures. This bound is tight to the...

Full text



Journal:

Volume   Issue 

Pages  -

Publication date: 2002